Lectures on Probability, Entropy, and Statistical Physics
Abstract
Preface

Science consists in using information about the world for the purpose of predicting, explaining, understanding, and/or controlling phenomena of interest. The basic difficulty is that the available information is usually insufficient to attain any of those goals with certainty. In these lectures we will be concerned with the problem of inductive inference, that is, the problem of reasoning under conditions of incomplete information. Is there a general method for handling uncertainty? Or, at least, are there rules that could in principle be followed by an ideally rational agent when discussing scientific matters? What makes one statement more plausible than another? How much more plausible? And then, when new information is acquired, how does the agent change its mind? Or, to put it differently, are there rules for learning? Are there rules for processing information that are objective and consistent? Are they unique? And, come to think of it, what, after all, is information? It is clear that data "contains" or "conveys" information, but what does this precisely mean? Can information be conveyed in other ways? Is information some sort of physical fluid that can be contained or transported? Is information physical? Can we measure amounts of information? Do we need to?

Our goal is to develop the main tools for inductive inference – probability and entropy – and to illustrate their use in physics. To be specific we will concentrate on examples borrowed from the foundations of classical statistical physics, but this is not meant to reflect a limitation of these inductive methods, which, as far as we can tell at present, are of universal applicability. It is just that statistical mechanics is rather special in that it provides us with the first examples of fundamental laws of physics that can be derived as examples of inductive inference. Perhaps all laws of physics can be derived in this way.

The level of these lectures is somewhat uneven.
Some topics are fairly advanced – the subject of recent research – while some other topics are very elementary. I can give two related reasons for including the latter. First, the standard education of physicists includes a very limited study of probability and even of entropy – maybe just a little about errors in a laboratory course, or maybe a couple of lectures as a brief mathematical prelude to statistical mechanics. The result is a widespread …
Similar resources
Geometric & Probabilistic Aspects of Boson Lattice Models
This review describes quantum systems of bosonic particles moving on a lattice. These models are relevant in statistical physics, and have natural ties with probability theory. The general setting is recalled and the main questions about phase transitions are addressed. A lattice model with Lennard-Jones potential is studied as an example of a system where first-order phase transitions occur. A m...
Assessment of Anesthesia Depth Using Effective Brain Connectivity Based on Transfer Entropy on EEG Signal
Introduction: Ensuring an adequate Depth of Anesthesia (DOA) during surgery is essential for anesthesiologists. Since the effect of anesthetic drugs is on the central nervous system, brain signals such as Electroencephalogram (EEG) can be used for DOA estimation. Anesthesia can interfere among brain regions, so the relationship among different areas can be a key factor in the anesthetic process...
Black Holes and String Theory
This is a short summary of my lectures given at the Fourth Mexican School on Gravitation and Mathematical Physics. These lectures gave a brief introduction to black holes in string theory, in which I primarily focussed on describing some of the recent calculations of black hole entropy using the statistical mechanics of D-brane states. The following overview will also provide the interested stu...
Shannon entropy: a rigorous mathematical notion at the crossroads between probability, information theory, dynamical systems and statistical physics
Statistical entropy was introduced by Shannon as a basic concept in information theory, measuring the average missing information on a random source. Extended into an entropy rate, it gives bounds in coding and compression theorems. I here present how statistical entropy and entropy rate relate to other notions of entropy, relevant either to probability theory (entropy of a discrete probability...
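The notion described above – entropy as the average missing information about a random source – has a standard formula, H = −Σ p log p. As an illustrative aside (not part of the original paper, and with hypothetical function names), a minimal sketch in Python:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p) of a discrete distribution.

    `probs` is a sequence of probabilities summing to 1. Zero-probability
    outcomes contribute nothing, since p * log(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin leaves exactly 1 bit of missing information,
# while a certain outcome leaves none:
print(shannon_entropy([0.5, 0.5]))  # maximal for two outcomes
print(shannon_entropy([1.0]))       # no uncertainty at all
```

With `base=2` the result is in bits; using the natural logarithm instead gives nats, the convention more common in statistical physics.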
Exotic statistical physics: Applications to biology, medicine, and economics
This manuscript is based on four opening lectures, which were designed to offer a brief and somewhat parochial overview of some "exotic" statistical physics puzzles of possible interest to biophysicists, medical physicists, and econophysicists. These include the statistical properties of DNA sequences, heartbeat intervals, brain plaques in Alzheimer brains, and fluctuations in economics. These pro...
Journal: CoRR
Volume: abs/0808.0012
Pages: -
Publication date: 2008